Fat-Fast VG-RAM WNN: A high performance approach
Authors
Abstract
The Virtual Generalizing Random Access Memory Weightless Neural Network (VG-RAM WNN) is a type of WNN that only requires storage capacity proportional to the size of the training set. As such, it is an effective machine learning technique that offers simple implementation and fast training, which can be performed in a single shot. However, VG-RAM WNN test time can be large for applications that require many training samples, since it grows with the size of each neuron's memory. In this paper, we present the Fat-Fast VG-RAM WNN, which employs multi-index chained hashing for fast neuron memory search. Our chained hashing technique increases VG-RAM memory consumption (fat) but reduces test time substantially (fast), while preserving most of its machine learning performance. To address the memory consumption problem, we employ a data clustering technique that reduces the overall size of the neurons' memory by replacing clusters of stored memory entries with their respective centroid values. With this approach, we were able to reduce VG-RAM WNN test time and memory footprint while maintaining acceptable machine learning performance. We performed experiments with the Fat-Fast VG-RAM WNN on two recognition problems: (i) handwritten digit recognition and (ii) traffic sign recognition. Our experimental results showed that, in both recognition problems, the new approach runs three orders of magnitude faster and consumes two orders of magnitude less memory than the standard VG-RAM WNN, with only a small reduction in recognition performance.
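The abstract only describes the search scheme at a high level, so the following is a minimal sketch of one plausible reading: each VG-RAM neuron stores (binary input pattern, label) pairs, and "multi-index chained hashing" is taken to mean splitting every binary input into fixed-width substrings and chaining pattern indices in one hash table per substring, so that test-time lookup only computes Hamming distances over bucket hits instead of scanning the whole memory. All names (FatFastNeuron, train, test, n_indexes) are hypothetical and not from the paper.

```python
import numpy as np
from collections import defaultdict

class FatFastNeuron:
    """Sketch of a single VG-RAM neuron whose memory search is accelerated
    with multi-index chained hashing (names and structure are assumptions)."""

    def __init__(self, n_bits, n_indexes=4):
        assert n_bits % n_indexes == 0, "n_bits must split evenly into substrings"
        self.n_bits = n_bits
        self.n_indexes = n_indexes
        self.chunk = n_bits // n_indexes
        self.patterns = []      # stored binary input patterns
        self.labels = []        # associated desired outputs
        # one chained hash table (dict of index lists) per substring position
        self.tables = [defaultdict(list) for _ in range(n_indexes)]

    def _keys(self, bits):
        # Split the binary input into n_indexes equal substrings and turn
        # each substring into a hashable bucket key.
        return [bits[i * self.chunk:(i + 1) * self.chunk].tobytes()
                for i in range(self.n_indexes)]

    def train(self, bits, label):
        # One-shot training: store the input-output pair and index the
        # input pattern in every table ("fat": extra index memory).
        bits = np.asarray(bits, dtype=np.uint8)
        idx = len(self.patterns)
        self.patterns.append(bits)
        self.labels.append(label)
        for table, key in zip(self.tables, self._keys(bits)):
            table[key].append(idx)

    def test(self, bits):
        # Candidates are entries that match the query exactly in at least
        # one substring; only they get a full Hamming-distance check
        # ("fast": far fewer comparisons than a linear scan of the memory).
        bits = np.asarray(bits, dtype=np.uint8)
        candidates = set()
        for table, key in zip(self.tables, self._keys(bits)):
            candidates.update(table.get(key, ()))
        if not candidates:                      # no bucket hit: fall back to a full scan
            candidates = range(len(self.patterns))
        best = min(candidates,
                   key=lambda i: int(np.count_nonzero(self.patterns[i] != bits)))
        return self.labels[best]


# Toy usage: 8-bit synaptic input patterns split into four 2-bit substrings.
neuron = FatFastNeuron(n_bits=8, n_indexes=4)
neuron.train([1, 0, 1, 1, 0, 0, 1, 0], "digit-3")
neuron.train([0, 1, 0, 0, 1, 1, 0, 1], "digit-7")
print(neuron.test([1, 0, 1, 1, 0, 1, 1, 0]))    # nearest stored pattern -> "digit-3"
```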
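The clustering step is likewise described only in outline. The sketch below shows one possible interpretation, assuming that a neuron's stored patterns are grouped per output label and compressed with k-means, with each centroid binarized by majority vote so it remains a valid memory entry. The function reduce_memory, the parameter max_entries_per_label, and the choice of k-means are illustrative assumptions, not details taken from the paper.

```python
import numpy as np
from sklearn.cluster import KMeans

def reduce_memory(patterns, labels, max_entries_per_label=16):
    """Shrink a neuron's memory by replacing each label's stored patterns
    with the binarized centroids of k-means clusters (illustrative sketch)."""
    patterns = np.asarray(patterns, dtype=np.uint8)
    labels = np.asarray(labels)
    new_patterns, new_labels = [], []
    for label in np.unique(labels):
        group = patterns[labels == label]
        k = min(max_entries_per_label, len(group))
        if k == len(group):
            centroids = group.astype(float)     # nothing to compress for this label
        else:
            km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(group)
            centroids = km.cluster_centers_
        # Binarize centroids by majority vote so they stay valid binary
        # VG-RAM input patterns, and keep the label they stand in for.
        for c in (centroids >= 0.5).astype(np.uint8):
            new_patterns.append(c)
            new_labels.append(label)
    return new_patterns, new_labels
```

Compressing the memory this way trades a small loss in recognition performance for a much smaller footprint, which is consistent with the trade-off reported in the abstract.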
Similar papers
Automated multi-label text categorization with VG-RAM weightless neural networks
In automated multi-label text categorization, an automatic categorization system should output a label set, whose size is unknown a priori, for each document under analysis. Many machine learning techniques have been used for building such automatic text categorization systems. In this paper, we examine virtual generalizing random access memory weightless neural networks (VG-RAM WNN), an effect...
Multi-Label Text Categorization with a Data Correlated VG-RAM Weightless Neural Network
In multi-label text categorization, one or more labels (or categories) can be assigned to a single document. In many such categorization tasks, there can be correlations in the assignment of subsets of the set of categories. This can be exploited to improve machine learning techniques devoted to multi-label text categorization. In this paper, we examine a Virtual Generalizing Random Access Memor...
Face Recognition with VG-RAM Weightless Neural Networks
Virtual Generalizing Random Access Memory Weightless Neural Networks (VG-RAM WNN) are effective machine learning tools that offer simple implementation and fast training and test. We examined the performance of VG-RAM WNN on face recognition using a well-known face database, the AR Face Database. We evaluated two VG-RAM WNN architectures configured with different numbers of neurons and synapses p...
VG-RAM WNN Approach to Monocular Depth Perception
We have examined Virtual Generalizing Random Access Memory Weightless Neural Networks (VG-RAM WNN) as a platform for depth map inference from static monocular images. For that, we have designed, implemented and compared the performance of VG-RAM WNN systems against that of depth estimation systems based on Markov Random Field (MRF) models. While not surpassing the performance of such systems, our...
Accurate Wavelet Neural Network for Efficient Controlling of an Active Magnetic Bearing System
Problem statement: A command synthesized by a neural network has clear advantages over classical controllers such as PID. This study presented a fast and accurate Wavelet Neural Network (WNN) approach for efficient control of an Active Magnetic Bearing (AMB) system. Approach: The proposed approach combined neural networks with wavelet theory. Wavelet theory may be exploited in d...
Journal: Neurocomputing
Volume 183, Issue: -
Pages: -
Publication year: 2016